Optimizer (machine learning)
A specific implementation of the gradient descent algorithm. Popular optimizers include:
- AdaGrad, which stands for ADAptive GRADient; it adapts each parameter's learning rate based on the history of that parameter's gradients.
- Adam, which stands for ADAptive Moment estimation; it combines per-parameter adaptive learning rates with momentum (see the sketch below).
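To make the idea concrete, here is a minimal NumPy sketch of a single Adam update step. The `adam_step` function and its `state` dictionary are illustrative names rather than any library's API, but the update rule and the default hyperparameters (`lr=0.001`, `beta1=0.9`, `beta2=0.999`, `eps=1e-8`) follow the published algorithm (Kingma & Ba, 2015):

```python
import numpy as np

def adam_step(params, grads, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: scale each parameter's step by running estimates of
    the gradient's first moment (mean) and second moment (uncentered variance).
    Illustrative sketch, not a library implementation."""
    state["t"] += 1
    t = state["t"]
    state["m"] = beta1 * state["m"] + (1 - beta1) * grads       # first-moment estimate
    state["v"] = beta2 * state["v"] + (1 - beta2) * grads ** 2  # second-moment estimate
    m_hat = state["m"] / (1 - beta1 ** t)                       # correct initialization bias
    v_hat = state["v"] / (1 - beta2 ** t)
    return params - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0])
state = {"t": 0, "m": np.zeros_like(w), "v": np.zeros_like(w)}
for _ in range(2000):
    w = adam_step(w, 2 * w, state, lr=0.01)
print(w)  # close to [0, 0]
```

Plain gradient descent would apply the raw gradient scaled by a single global learning rate; the adaptive state kept in `m` and `v` is what distinguishes optimizers such as Adam and AdaGrad from that baseline.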